New results on subgradient methods for strongly convex optimization problems with a unified analysis

Author

  • Masaru Ito
Abstract

We develop subgradient- and gradient-based methods for minimizing strongly convex functions under a notion which generalizes the standard Euclidean strong convexity. We propose a unifying framework for subgradient methods which yields two kinds of methods, namely, the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), unifying several existing methods. The unifying framework provides tools to analyze the convergence of PGMs and CGMs for non-smooth, (weakly) smooth, and further for structured problems such as the inexact oracle models. The proposed subgradient methods yield optimal PGMs for several classes of problems, and yield optimal and nearly optimal CGMs for smooth and weakly smooth problems, respectively.
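For context, the classical Euclidean special case of this setting can be sketched as follows. This is a minimal illustrative sketch, not the paper's unified framework: it assumes a μ-strongly convex objective with a hypothetical subgradient oracle `subgrad` and an exact Euclidean projection `project`, and uses the step size 2/(μ(k+1)) that attains the optimal O(1/k) rate in this classical setting.

```python
import numpy as np

def strongly_convex_subgradient(subgrad, project, x0, mu, iters=1000):
    """Projected subgradient method for a mu-strongly convex objective.
    The step size 2/(mu*(k+1)) together with (k+1)-weighted averaging
    attains the optimal O(1/k) rate in the Euclidean setting."""
    x = np.asarray(x0, dtype=float)
    x_avg = np.zeros_like(x)
    weight_sum = 0.0
    for k in range(iters):
        g = subgrad(x)                               # any subgradient at x
        x = project(x - (2.0 / (mu * (k + 1))) * g)  # projected step
        w = k + 1.0                                  # averaging weight
        x_avg = (weight_sum * x_avg + w * x) / (weight_sum + w)
        weight_sum += w
    return x_avg

# Example: minimize 0.5*||x - b||^2 (mu = 1) over the Euclidean unit ball.
b = np.array([2.0, -1.0])
sol = strongly_convex_subgradient(
    subgrad=lambda x: x - b,
    project=lambda x: x / max(1.0, np.linalg.norm(x)),
    x0=np.zeros(2), mu=1.0)
```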


Related articles

Incremental Subgradients for Constrained Convex Optimization: A Unified Framework and New Methods

We present a unifying framework for nonsmooth convex minimization bringing together ε-subgradient algorithms and methods for the convex feasibility problem. This development is a natural step for ε-subgradient methods in the direction of constrained optimization, since the Euclidean projection frequently required in such methods is replaced by an approximate projection, which is often easier to co...
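A common way to realize such an approximate projection is a single subgradient (halfspace) projection onto a sublevel-set representation of the constraint. The sketch below is illustrative, assuming the feasible set is given as {y : c(y) ≤ 0} for a convex c with a hypothetical subgradient oracle `c_subgrad`; it is not necessarily the construction used in the paper above.

```python
import numpy as np

def approx_project(x, c, c_subgrad):
    """Approximate projection onto {y : c(y) <= 0} by one subgradient
    (halfspace) projection: project x onto the cutting halfspace
    {y : c(x) + <g, y - x> <= 0}, where g is a subgradient of c at x.
    Usually far cheaper than the exact Euclidean projection, and equal
    to it when c is affine."""
    val = c(x)
    if val <= 0.0:
        return x                        # already feasible
    g = c_subgrad(x)
    return x - (val / np.dot(g, g)) * g

# Example: approximate projection onto the unit ball via c(x) = ||x|| - 1.
x = np.array([3.0, 4.0])
y = approx_project(x, lambda z: np.linalg.norm(z) - 1.0,
                   lambda z: z / np.linalg.norm(z))
```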


Error Stability Properties of Generalized Gradient-Type Algorithms

We present a unified framework for convergence analysis of generalized subgradient-type algorithms in the presence of perturbations. A principal novel feature of our analysis is that perturbations need not tend to zero in the limit. It is established that the iterates of the algorithms are attracted, in a certain sense, to an ε-stationary set of the problem, where ε depends on the magnitude of ...
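The perturbed iteration being analyzed has, in its simplest form, the shape sketched below; this is a toy sketch under assumed oracle names (`subgrad`, `noise`), not the paper's general algorithm class.

```python
import numpy as np

def perturbed_subgradient(subgrad, noise, x0, steps, alpha=1e-2):
    """Subgradient iteration with a perturbed oracle.  The perturbation
    noise(k) need not vanish as k grows; with perturbations bounded by
    eps, the iterates are attracted to an eps-stationary set rather than
    to the exact solution set."""
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        x = x - alpha * (subgrad(x) + noise(k))
    return x
```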


Incremental Gradient, Subgradient, and Proximal Methods for Convex Optimization: A Survey

We survey incremental methods for minimizing a sum f(x) = ∑_{i=1}^{m} f_i(x) consisting of a large number of convex component functions f_i. Our methods consist of iterations applied to single components, and have proved very effective in practice. We introduce a unified algorithmic framework for a variety of such methods, some involving gradient and subgradient iterations, which are known, and some involvin...
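The basic cyclic instance of such a method can be sketched as follows; this is a minimal sketch with a diminishing step size, assuming each component is available through a hypothetical subgradient oracle, and it is only one of the many variants the survey covers.

```python
import numpy as np

def incremental_subgradient(component_subgrads, x0, passes, alpha0=0.1):
    """Incremental subgradient method for f(x) = sum_{i=1}^m f_i(x): each
    inner update uses a subgradient of a single component f_i, cycling
    through the m components once per pass, with a diminishing step."""
    x = np.asarray(x0, dtype=float)
    for k in range(passes):
        alpha = alpha0 / np.sqrt(k + 1)      # diminishing step size
        for subgrad_i in component_subgrads: # one cycle over components
            x = x - alpha * subgrad_i(x)
    return x
```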


Lecture 2: Subgradient Methods

In this lecture, we discuss first-order methods for the minimization of convex functions. We focus almost exclusively on subgradient-based methods, which are essentially universally applicable for convex optimization problems, because they rely very little on the structure of the problem being solved. This leads to effective but slow algorithms in classical optimization problems; however, in la...
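The basic method in question is the plain subgradient iteration, sketched below with the standard α₀/√(k+1) step size; the O(1/√k) rate is what makes it "effective but slow". The oracle names are assumptions for illustration.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps, alpha0=1.0):
    """Plain subgradient method with step size alpha0/sqrt(k+1).  The
    objective need not decrease monotonically along the iterates, so the
    best point seen so far is tracked; the classical rate is O(1/sqrt(k))."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        x = x - (alpha0 / np.sqrt(k + 1)) * subgrad(x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x
```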


On The Behavior of Subgradient Projections Methods for Convex Feasibility Problems in Euclidean Spaces

We study some methods of subgradient projections for solving a convex feasibility problem with general (not necessarily hyperplanes or half-spaces) convex sets in the inconsistent case and propose a strategy that controls the relaxation parameters in a specific self-adapting manner. This strategy leaves enough user-flexibility but gives a mathematical guarantee for the algorithm's behavior in t...
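A cyclic variant of subgradient projections can be sketched as follows. This sketch assumes each set is described as {x : c_j(x) ≤ 0} with a hypothetical subgradient oracle, and uses a fixed relaxation parameter, whereas the paper above controls the relaxation parameters adaptively.

```python
import numpy as np

def cyclic_subgradient_projections(constraints, x0, sweeps, relax=1.0):
    """Cyclic subgradient projections for the feasibility problem
    c_j(x) <= 0 for all j.  `constraints` is a list of (c_j, subgrad_j)
    pairs; `relax` in (0, 2) is the relaxation parameter, fixed here
    rather than self-adapting as in the paper above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        for c, c_subgrad in constraints:
            val = c(x)
            if val > 0.0:                    # violated: one halfspace step
                g = c_subgrad(x)
                x = x - relax * (val / np.dot(g, g)) * g
    return x
```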



Journal:
  • Comp. Opt. and Appl.

Volume 65, Issue -

Pages -

Publication date: 2016